Faster Least Squares Approximation

Authors

  • Petros Drineas
  • Michael W. Mahoney
  • S. Muthukrishnan
  • Tamás Sarlós
Abstract

Least squares approximation is a technique to find an approximate solution to a system of linear equations that has no exact solution. In a typical setting, one lets n be the number of constraints and d be the number of variables, with n ≫ d. Then, existing exact methods find a solution vector in O(nd²) time. We present two randomized algorithms that provide accurate relative-error approximations to the optimal value and the solution vector of a least squares approximation problem more rapidly than existing exact algorithms. Both of our algorithms preprocess the data with the randomized Hadamard transform. One then uniformly randomly samples constraints and solves the smaller problem on those constraints, and the other performs a sparse random projection and solves the smaller problem on those projected coordinates. In both cases, solving the smaller problem provides relative-error approximations, and, if n is sufficiently larger than d, the approximate solution can be computed in O(nd ln d) time.

Mathematics Subject Classification (2000): 65F99

P. Drineas (B), Department of Computer Science, Rensselaer Polytechnic Institute, Troy, NY, USA; e-mail: [email protected]
M. W. Mahoney, Department of Mathematics, Stanford University, Stanford, CA, USA; e-mail: [email protected]
S. Muthukrishnan, Google, Inc., New York, NY, USA; e-mail: [email protected]
T. Sarlós, Yahoo! Research, Sunnyvale, CA, USA; e-mail: [email protected]
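The sampling-based variant described in the abstract (random sign flips, a Hadamard transform, then uniform row sampling) can be illustrated with a short NumPy sketch. This is our own simplified rendering, not the authors' implementation: the function names `fwht` and `sketched_lstsq`, the sampling size `r`, and the rescaling convention are all choices made here for illustration, and n is assumed to be a power of two.

```python
import numpy as np

def fwht(x):
    """Fast Walsh-Hadamard transform along axis 0 (length must be a power of 2)."""
    x = x.copy()
    n = x.shape[0]
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            a = x[i:i + h].copy()
            b = x[i + h:i + 2 * h].copy()
            x[i:i + h] = a + b          # butterfly: sum ...
            x[i + h:i + 2 * h] = a - b  # ... and difference
        h *= 2
    return x

def sketched_lstsq(A, b, r, rng):
    """Approximate least squares via a subsampled randomized Hadamard transform."""
    n = A.shape[0]
    # Random sign flips followed by the (orthonormal) Hadamard transform
    # spread out the influence of any single constraint.
    signs = rng.choice([-1.0, 1.0], size=n)
    HA = fwht(signs[:, None] * A) / np.sqrt(n)
    Hb = fwht(signs * b) / np.sqrt(n)
    # Uniformly sample r rows of the rotated problem and solve the small problem.
    idx = rng.choice(n, size=r, replace=True)
    scale = np.sqrt(n / r)  # rescale so the sketch approximates the full problem
    x, *_ = np.linalg.lstsq(scale * HA[idx], scale * Hb[idx], rcond=None)
    return x
```

With `r` chosen sufficiently large relative to d, the residual of the sketched solution is close to the optimal residual, while only an r-by-d problem is ever solved exactly.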


Similar articles

Numerical solution of the spread of infectious diseases mathematical model based on shifted Bernstein polynomials

The Volterra delay integral equations have numerous applications in various branches of science, including biology, ecology, physics and modeling of engineering and natural sciences. In many cases, it is difficult to obtain analytical solutions of these equations. So, numerical methods as an efficient approximation method for solving Volterra delay integral equations are of interest to many res...


A meshless discrete Galerkin method for solving the universe evolution differential equations based on the moving least squares approximation

In terms of observational data, there are some problems in the standard Big Bang cosmological model. Inflation era, early accelerated phase of the evolution of the universe, can successfully solve these problems. The inflation epoch can be explained by scalar inflaton field. The evolution of this field is presented by a non-linear differential equation. This equation is considered in FLRW model...


Faster Convergence and Improved Performance in Least-Squares Training of Neural Networks for Active Sound Cancellation

This paper introduces new recursive least-squares algorithms with faster convergence and improved steady-state performance for the training of multilayer feedforward neural networks, used in a two neural networks structure for multichannel nonlinear active sound cancellation. Non-linearity in active sound cancellation systems is mostly found in actuators. The paper introduces the main concepts ...


Optimal Pareto Parametric Analysis of Two Dimensional Steady-State Heat Conduction Problems by MLPG Method

Numerical solutions obtained by the Meshless Local Petrov-Galerkin (MLPG) method are presented for two dimensional steady-state heat conduction problems. The MLPG method is a truly meshless approach, and neither the nodal connectivity nor the background mesh is required for solving the initial-boundary-value problem. The penalty method is adopted to efficiently enforce the essential boundary co...


ENTER TITLE HERE (14 pt type size, 10 words max, uppercase, bold, centered)

An approximation for the constant modulus (CM) cost function is proposed to allow the use of the fast recursive least squares (RLS) algorithm. Simulations are performed to compare the performance of the introduced RLS-CM and stochastic gradient descent (SGD) algorithms for blind adaptive beamforming. Results indicate that the introduced RLS-CM has faster convergence speed and good tracking abil...


A New Class of Incremental Gradient Methods for Least Squares Problems

The LMS method for linear least squares problems differs from the steepest descent method in that it processes data blocks one-by-one, with intermediate adjustment of the parameter vector under optimization. This mode of operation often leads to faster convergence when far from the eventual limit, and to slower (sublinear) convergence when close to the optimal solution. We embed both LMS and st...
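The contrast drawn in this snippet, incremental row-by-row updates versus full-gradient steps, can be made concrete with a minimal NumPy sketch. The function names, step sizes, and iteration counts here are illustrative choices of ours, not taken from the paper:

```python
import numpy as np

def lms(A, b, step=0.05, epochs=500):
    """LMS-style incremental mode: adjust x after processing each single constraint."""
    x = np.zeros(A.shape[1])
    for _ in range(epochs):
        for a_i, b_i in zip(A, b):
            x += step * (b_i - a_i @ x) * a_i  # gradient step on one row only
    return x

def steepest_descent(A, b, step=0.01, iters=500):
    """Batch counterpart: one step per full pass, using the gradient of ||Ax - b||^2 / 2."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x += step * (A.T @ (b - A @ x))
    return x
```

The incremental version updates the parameter vector n times per sweep through the data, which is the intermediate-adjustment behavior the snippet refers to.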




Journal:
  • Numerische Mathematik

Volume: 117  Issue: –

Pages: –

Publication year: 2011